Block Conjugate Gradient algorithms for least squares problems
Authors
Abstract
Similar articles
Conjugate gradient total least-squares in geophysical optimization problems
Golub and Van Loan (1980) presented a numerically stable TLS algorithm which utilizes the singular value decomposition (SVD). Subsequent refinements to the method predominantly use SVD, and much of the current literature emphasizes stabilization of the inverse and implicit model regularization by SVD truncation (Fierro et al., 1997). Because it is numerically intensive, however, the SVD generally p...
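For reference, the classical SVD-based TLS construction mentioned in this snippet can be sketched in a few lines of NumPy. This is only an illustration of the Golub–Van Loan solution via the augmented matrix [A b], not the geophysical CG-TLS algorithm of the cited paper; the function name tls_svd and the test data are assumptions.

```python
import numpy as np

def tls_svd(A, b):
    """Illustrative sketch: total least-squares solution of A x ~ b via the
    SVD of the augmented matrix [A b] (classical Golub-Van Loan construction).
    Not the CG-TLS method of the cited paper."""
    m, n = A.shape
    # Right singular vector belonging to the smallest singular value of [A b]
    _, _, Vt = np.linalg.svd(np.column_stack([A, b]))
    v = Vt[-1]                       # last row of V^T = last column of V
    if np.isclose(v[-1], 0.0):
        raise ValueError("TLS solution does not exist (last component is zero)")
    return -v[:n] / v[n]

# Small illustrative overdetermined system
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true + 0.01 * rng.normal(size=20)
print(tls_svd(A, b))
```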
Gradient methods and conic least-squares problems
This paper presents two reformulations of the dual of the constrained least squares problem over convex cones. In addition, it extends Nesterov’s excessive gap method 1 [21] to more general problems. The conic least squares problem is then solved by applying the resulting modified method, or Nesterov’s smooth method [22], or Nesterov’s excessive gap method 2 [21], to the dual reformulations. Nu...
Linear regression models, least-squares problems, normal equations, and stopping criteria for the conjugate gradient method
Minimum-variance unbiased estimates for linear regression models can be obtained by solving least-squares problems. The conjugate gradient method can be successfully used in solving the symmetric and positive definite normal equations obtained from these least-squares problems. Taking into account the results of Golub and Meurant (1997, 2009) [10,11], Hestenes and Stiefel (1952) [17], and Strako...
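The technique referenced here, conjugate gradient on the normal equations, is often written in the CGLS form that avoids forming A^T A explicitly. The following is a minimal sketch under that reading; the function name cgls, the tolerance, and the iteration cap are illustrative and not taken from the cited paper.

```python
import numpy as np

def cgls(A, b, x0=None, tol=1e-10, maxiter=500):
    """Illustrative sketch: conjugate gradient applied to the normal equations
    A^T A x = A^T b, in the CGLS form that never forms A^T A explicitly."""
    m, n = A.shape
    x = np.zeros(n) if x0 is None else x0.copy()
    r = b - A @ x            # residual of the least-squares system
    s = A.T @ r              # residual of the normal equations
    p = s.copy()
    gamma = s @ s
    for _ in range(maxiter):
        q = A @ p
        alpha = gamma / (q @ q)
        x += alpha * p
        r -= alpha * q
        s = A.T @ r
        gamma_new = s @ s
        if np.sqrt(gamma_new) < tol:   # stop on the normal-equation residual
            break
        p = s + (gamma_new / gamma) * p
        gamma = gamma_new
    return x
```

A natural stopping criterion, as discussed in work of this kind, is the norm of the normal-equation residual A^T(b - Ax); the simple threshold used above is only a placeholder.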
Conjugate gradient acceleration of iteratively re-weighted least squares methods
Iteratively Re-weighted Least Squares (IRLS) is a method for solving minimization problems involving non-quadratic cost functions, perhaps non-convex and non-smooth, which however can be described as the infimum over a family of quadratic functions. This transformation suggests an algorithmic scheme that solves a sequence of quadratic problems to be tackled efficiently by tools of numerical lin...
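As a concrete instance of the scheme described in this snippet, the sketch below runs IRLS for the l1 objective min ||Ax - b||_1 and solves each weighted quadratic subproblem with a plain conjugate gradient iteration on its SPD normal equations. The function name irls_l1, the smoothing parameter eps, and the iteration caps are assumptions; this is a generic IRLS-plus-CG loop, not the specific acceleration proposed in the cited paper.

```python
import numpy as np

def irls_l1(A, b, iters=30, eps=1e-6, cg_tol=1e-8, cg_maxiter=200):
    """Illustrative sketch: IRLS for min ||Ax - b||_1, each weighted quadratic
    subproblem solved by conjugate gradient on A^T W A x = A^T W b."""
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(iters):
        # Reweighting for the l1 cost; eps keeps the weights bounded
        w = 1.0 / np.maximum(np.abs(A @ x - b), eps)
        M = A.T @ (w[:, None] * A)          # A^T W A (SPD for full-rank A)
        rhs = A.T @ (w * b)
        # Plain CG on the SPD system M x = rhs, warm-started at the current x
        r = rhs - M @ x
        p = r.copy()
        rho = r @ r
        for _ in range(cg_maxiter):
            q = M @ p
            alpha = rho / (p @ q)
            x = x + alpha * p
            r = r - alpha * q
            rho_new = r @ r
            if np.sqrt(rho_new) < cg_tol:
                break
            p = r + (rho_new / rho) * p
            rho = rho_new
    return x
```

Warm-starting the inner CG solve from the previous iterate is one simple way the quadratic subproblems become cheap as the weights stabilize.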
An Efficient Conjugate Gradient Algorithm for Unconstrained Optimization Problems
In this paper, an efficient conjugate gradient method for unconstrained optimization is introduced. Parameters of the method are obtained by solving an optimization problem, and using a variant of the modified secant condition. The new conjugate gradient parameter benefits from function information as well as gradient information in each iteration. The proposed method has global convergence und...
Journal
Journal title: Journal of Computational and Applied Mathematics
Year: 2017
ISSN: 0377-0427
DOI: 10.1016/j.cam.2016.11.031